Extensible hyperplane nets

Authors

  • Gottlieb Pirsic
  • Friedrich Pillichshammer
Abstract

Article history: Received 21 December 2010; Revised 1 February 2011; Accepted 3 February 2011; Available online 10 February 2011. Communicated by Arne Winterhof. MSC: 11K38, 11K45, 65C05, 65D30.


Similar references

Discrepancy of hyperplane nets and cyclic nets

Digital nets are very important representatives in the family of low-discrepancy point sets, which are often used as underlying nodes for quasi-Monte Carlo integration rules. Here we consider a special sub-class of digital nets known as cyclic nets and, more generally, hyperplane nets. We show the existence of such digital nets of good quality with respect to star discrepancy in the classical as we...
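The abstract above concerns the star discrepancy of point sets used for quasi-Monte Carlo integration. As an illustrative sketch (not taken from the paper), the one-dimensional star discrepancy has an exact closed form due to Niederreiter, shown here applied to a van der Corput point set, a classic low-discrepancy construction; all function names are my own:

```python
def star_discrepancy_1d(points):
    # Exact 1-D star discrepancy via Niederreiter's formula:
    # D*_N = 1/(2N) + max_i |x_(i) - (2i-1)/(2N)|, points sorted ascending.
    xs = sorted(points)
    n = len(xs)
    return 1.0 / (2 * n) + max(abs(x - (2 * i - 1) / (2.0 * n))
                               for i, x in enumerate(xs, start=1))

def van_der_corput(i, base=2):
    # i-th element of the van der Corput sequence: radical inverse of i in `base`.
    q, denom = 0.0, 1.0
    while i:
        i, r = divmod(i, base)
        denom *= base
        q += r / denom
    return q

points = [van_der_corput(i) for i in range(16)]
d = star_discrepancy_1d(points)  # 0.0625, i.e. 1/N for N = 16
```

The first 16 van der Corput points are exactly {k/16 : k = 0, ..., 15}, so the formula evaluates to 1/32 + 1/32 = 1/16, illustrating the O(1/N) discrepancy decay that low-discrepancy sets achieve in one dimension.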


A Theorem on Conjugate Nets in Projective Hyperspace

A theorem proved by C. C. Hsiung in a recent paper [1] may be stated as follows: In a linear space S_n of n (≥ 3) dimensions let N_x be a conjugate net and π be a fixed hyperplane; then the points M, M̄ of intersection of the fixed hyperplane π and the two tangents at a point x of the net N_x describe two conjugate nets N_m, N_m̄ in the hyperplane π, respectively, and one of the two nets N_m, N_m̄ i...


Machine Learning: Feed-Forward Neural Nets Overview

Perceptrons. A perceptron is a linear classifier of the form y = sign(∑_{i=1}^{d} w_i x_i + b), where the weights w = (w_1, . . . , w_d) are trained using stochastic gradient descent. A perceptron is guaranteed to converge to some hyperplane separating two classes if the two classes are linearly separable (i.e., if there exists at least one hyperplane such that all points from Class 1 are on one side of it a...
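A minimal sketch of the perceptron rule described above, in plain Python (the toy data and function names are my own, not from the lecture):

```python
def train_perceptron(xs, ys, lr=1.0, epochs=100):
    # Perceptron learning rule: predict sign(sum_i w_i*x_i + b);
    # on each misclassified example, update w += lr*y*x and b += lr*y.
    d = len(xs[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, y in zip(xs, ys):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:          # misclassified (or on the boundary)
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                mistakes += 1
        if mistakes == 0:               # converged: a separating hyperplane found
            break
    return w, b

# Toy linearly separable data, labels in {-1, +1}
xs = [(2.0, 1.0), (1.0, 3.0), (-1.0, -2.0), (-2.0, -1.0)]
ys = [1, 1, -1, -1]
w, b = train_perceptron(xs, ys)

def predict(x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

On linearly separable data like this, the inner loop eventually makes a full pass with no mistakes, which is exactly the convergence guarantee the abstract mentions; on non-separable data the loop simply stops after `epochs` passes.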


Generalisation in Cubic Nodes - centres and clustering

This lecture deals with training nets of cubic nodes and introduces another major (quite general) algorithm-Reward Penalty. Insight into how we might train nets of cubic nodes is provided by considering the problems associated with generalisation in these nets. We then go on to consider feedback or recurrent nets from the point of view of their implementing iterated feedforward nets (recall thi...


Geometric GAN

Generative Adversarial Nets (GANs) represent an important milestone for effective generative models, one which has inspired numerous variants seemingly different from each other. One of the main contributions of this paper is to reveal a unified geometric structure in GANs and their variants. Specifically, we show that adversarial generative model training can be decomposed into three geometric st...



Journal:
  • Finite Fields and Their Applications

Volume 17, Issue —

Pages —

Publication date: 2011